feat: Add Chat Generator supporting OpenAI Responses API #9808
base: main
Conversation
Pull Request Test Coverage Report for Build 18779147146

Warning: This coverage report may be inaccurate. This pull request's base commit is no longer the HEAD commit of its target branch. This means it includes changes from outside the original pull request, including, potentially, unrelated coverage changes.

💛 - Coveralls
@Amnah199 left some initial comments; didn't go into details yet
        
          
                releasenotes/notes/add-openai-responses-chatgenerator-52ca7457a4e61db1.yaml
          
        
Added the OpenAIResponsesChatGenerator, a new component that integrates OpenAI's Responses API into Haystack.
This unlocks several advanced capabilities from the Responses API:
- Allowing retrieval of concise summaries of the model's reasoning process.
- Allowing the use of native OpenAI or MCP tool formats, along with Haystack Tool objects and Toolset instances.
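For illustration, a minimal usage sketch based on the release note above; the import path, model name, and `generation_kwargs` keys are assumptions, not necessarily the final API introduced by this PR:

```python
# Minimal sketch of using the new component with a Haystack Tool and a reasoning summary.
# Import path, model name, and generation_kwargs keys are assumptions for illustration.
from haystack.dataclasses import ChatMessage
from haystack.tools import Tool
from haystack.components.generators.chat import OpenAIResponsesChatGenerator  # assumed path


def get_weather(city: str) -> str:
    """Toy tool function used only for illustration."""
    return f"Sunny in {city}"


weather_tool = Tool(
    name="get_weather",
    description="Get the current weather for a city.",
    parameters={"type": "object", "properties": {"city": {"type": "string"}}, "required": ["city"]},
    function=get_weather,
)

# Requires OPENAI_API_KEY in the environment.
generator = OpenAIResponsesChatGenerator(
    model="gpt-5-mini",  # example model name, not prescribed by the PR
    tools=[weather_tool],
    generation_kwargs={"reasoning": {"summary": "auto"}},  # reasoning summaries, per the release note
)
result = generator.run(messages=[ChatMessage.from_user("What's the weather in Berlin?")])
print(result["replies"][0])
```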
What else do we support and integrate well from our ecosystem into Responses?
Structured outputs could be one.
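As a rough illustration of what structured-output support could look like, assuming the `text_format` key mentioned in the PR description is forwarded to the Responses API; the import path and parameter name are assumptions, not the confirmed API:

```python
# Sketch of structured outputs via an assumed `text_format` passthrough in generation_kwargs.
from pydantic import BaseModel
from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIResponsesChatGenerator  # assumed path


class CityInfo(BaseModel):
    city: str
    country: str
    population: int


generator = OpenAIResponsesChatGenerator(
    generation_kwargs={"text_format": CityInfo},  # assumed structured-output passthrough
)
result = generator.run(messages=[ChatMessage.from_user("Give me basic facts about Berlin.")])
print(result["replies"][0].text)
```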
Completes chats using OpenAI's Responses API.
It works with the gpt-4 and o-series models and supports streaming responses
from OpenAI API. It uses [ChatMessage](https://docs.haystack.deepset.ai/docs/chatmessage)
I'd mention here the o3-deep-research, o4-mini-deep-research, and gpt-5-codex models, among others. People generally don't know that these specialized models/agents can be used with this ChatGenerator. Or do we not want to mention that at this time, because they usually need to run in the background, etc.?
OpenAI API. Use the `**generation_kwargs` argument when you initialize
the component or when you run it. Any parameter that works with
`openai.Responses.create` will work here too.
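A small sketch of the passthrough described in this docstring; the import path and the per-run `generation_kwargs` override are assumed to mirror the existing OpenAI chat generator:

```python
# generation_kwargs are forwarded to the Responses API call, so parameters accepted by
# the API (e.g. max_output_tokens, temperature) can be set at init time or per run.
from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIResponsesChatGenerator  # assumed path

generator = OpenAIResponsesChatGenerator(generation_kwargs={"max_output_tokens": 512})

# Per-run kwargs would override the init-time defaults, mirroring other Haystack chat generators.
result = generator.run(
    messages=[ChatMessage.from_user("Summarize the Responses API in one sentence.")],
    generation_kwargs={"temperature": 0.2},
)
print(result["replies"][0].text)
```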
Should we mention somewhere here what we don't support, i.e. background tasks, etc.?
Please merge main to update the test coverage information in #9808 (comment). Let's see if more unit tests are needed.
We discussed function tool calls with @Amnah199 and considered two possible solutions:

Since I'll be on PTO next week, feel free to merge this PR when it's ready.
@sjrl Would love your input here. However, if that's unlikely (or a definite no), I can try to make the first option work instead. Context: here's the commit showing what the first approach might look like.
Related Issues
Proposed Changes:
- Adds `OpenAIResponsesChatGenerator`, which uses the `responses` endpoint.
- The `init` and `run` methods allow OpenAI and MCP tools besides Haystack Tools (see the sketch below).
- Supports `text_format` in `generation_kwargs`.
- … `generation_kwargs`.
- `background` requests etc. are not supported in this iteration.
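A hedged sketch of mixing native OpenAI and MCP tool specs, assuming raw dict tool definitions are forwarded to the Responses API unchanged; the import path, server URL, and exact dict shapes are illustrative only:

```python
# Sketch: passing native OpenAI / MCP tool definitions as raw dicts alongside Haystack tools.
from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIResponsesChatGenerator  # assumed path

generator = OpenAIResponsesChatGenerator(
    tools=[
        {"type": "web_search"},  # native OpenAI built-in tool
        {
            "type": "mcp",  # remote MCP server, in the Responses API tool format
            "server_label": "deepwiki",
            "server_url": "https://mcp.deepwiki.com/mcp",  # example server URL
            "require_approval": "never",
        },
    ],
)
result = generator.run(messages=[ChatMessage.from_user("What does the deepset-ai/haystack repo do?")])
print(result["replies"][0].text)
```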
How did you test it?

Notes for the reviewer
OpenAI tools and MCP tools don't behave like standard function calls:

- Standard function calls come back with `type="function_call"`. However, for OpenAI/MCP tools the `type` value varies depending on the tool being used, for example `type="web_search_call"`.
- Tool calls are only parsed from `ResponseFunctionToolCall` objects. As a result, tool calls from OpenAI/MCP tools won't appear in the `ChatMessage`, so the user won't see them (see the sketch below).
Checklist

- I've used one of the conventional commit types for the PR title: `fix:`, `feat:`, `build:`, `chore:`, `ci:`, `docs:`, `style:`, `refactor:`, `perf:`, `test:`, and added `!` in case the PR includes breaking changes.